Move learning rate and related ops to pserver #8209
Conversation
persistable=var.persistable,
dtype=var.dtype,
shape=var.shape)
pserver_program.global_block().create_var(
Can add a FIXME here; the var should only be created once.
Done.
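A minimal sketch of the reviewer's suggestion: guard the creation so each variable is created only once per block. `create_var_once`, the dict-based `block`, and the `var` descriptor are hypothetical stand-ins for `pserver_program.global_block().create_var(...)`; the real Program API would use something like a has-var check instead.

```python
def create_var_once(block, var):
    # `block` here is a plain dict mapping var names to descriptors;
    # in the real API this guard would be a block.has_var(name) check
    # before calling block.create_var(...).
    if var["name"] not in block:
        block[var["name"]] = {
            "persistable": var["persistable"],
            "dtype": var["dtype"],
            "shape": var["shape"],
        }
    return block[var["name"]]

block = {}
v = {"name": "lr", "persistable": True, "dtype": "float32", "shape": [1]}
create_var_once(block, v)
create_var_once(block, v)  # second call is a no-op
```

With the guard in place, calling the helper repeatedly for the same variable leaves exactly one entry in the block.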
def _is_opt_op(self, op):
    # optimize op: SGDOptimize, MomentumOptimizer, AdamOptimizer and etc...
    if op.inputs and op.inputs.has_key("Param") \
Add a HACK comment here, since this is a hack.
Done.
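A sketch of the detection heuristic under discussion, rewritten for Python 3 (the quoted hunk uses `dict.has_key`, which is Python 2 only; `in` works in both). `is_opt_op` and the dict-shaped `op` are hypothetical stand-ins for the method and op objects in the PR.

```python
def is_opt_op(op):
    # HACK: detect optimizer ops (SGD, Momentum, Adam, ...) by the
    # presence of a "Param" input. This mirrors the PR's heuristic,
    # not a definitive classification of optimizer ops.
    return bool(op.get("inputs")) and "Param" in op["inputs"]
```

The heuristic returns True only when the op both has inputs and lists a "Param" input.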
opt_op_on_pserver.append(op)
for _, op in enumerate(self.optimize_ops):
    if not True in [
`not True in [...]` is still confusing.
Done.
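A sketch of why the reviewer finds `not True in [...]` hard to read, and the `any()` idiom that expresses the same check more clearly. The `flags` list is a hypothetical stand-in for the list comprehension of per-op boolean checks in the PR.

```python
flags = [False, False, True]

# `in` binds tighter than `not`, so this parses as `not (True in flags)`,
# which is easy to misread at a glance.
old_style = not True in flags

# For a list of booleans, this is equivalent and reads as intended:
# "true if no flag is set".
new_style = not any(flags)
```

Both expressions are equivalent for boolean lists; `any()` simply states the intent directly.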
pserver_program.global_block().create_var(
    name="%s.trainer_%d" % (v.name, trainer_id),
    persistable=True,
    dtype=v.dtype,
    shape=v.shape)
# step6
optimize_sub_program = Program()
ufind = self._create_ufind(self.optimize_ops)
Add comments or documentation about why we need `ufind`.
Done.
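A minimal union-find (disjoint-set) sketch illustrating what a `ufind` structure could be doing here: optimize ops that share variables must land in the same connected component so each group can be placed together on a pserver. The `UnionFind` class, the `ops` map, and the grouping loop are all hypothetical; the real `_create_ufind` operates on `self.optimize_ops`.

```python
class UnionFind:
    """Disjoint-set with path halving; groups items into components."""

    def __init__(self, items):
        self.parent = {x: x for x in items}

    def find(self, x):
        while self.parent[x] != x:
            # Path halving: point x at its grandparent as we walk up.
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[ra] = rb

# Hypothetical ops mapped to the variables they touch; ops that share a
# variable get unioned into one component.
ops = {"sgd_w": {"w"}, "sgd_b": {"b"}, "lr_decay": {"w", "b"}}
uf = UnionFind(ops)
for a in ops:
    for b in ops:
        if ops[a] & ops[b]:
            uf.union(a, b)
```

Here `lr_decay` touches both `w` and `b`, so all three ops end up in a single component and would be dispatched as one group.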
LGTM++
I'll separate the "return all optimize and related ops" feature into another PR: #8280
LGTM++
LGTM++
Fixed #8032